

Section: New Results

Mean field approaches

Asymptotic description of neural networks with correlated synaptic weights

Participants : Olivier Faugeras, James Maclaurin.

We study the asymptotic law of a network of interacting neurons as the number of neurons tends to infinity. Given a completely connected network of neurons in which the synaptic weights are correlated Gaussian random variables, we describe the asymptotic law of the network when the number of neurons goes to infinity. We introduce the process-level empirical measure of the trajectories of the solutions to the equations of the finite network of neurons, and the averaged law (with respect to the synaptic weights) of these trajectories. The main result of this work is that the image law through the empirical measure satisfies a large deviation principle with a good rate function, which is shown to have a unique global minimum. Our analysis of the rate function also allows us to characterize the limit measure as the image of a stationary Gaussian measure defined on a transformed set of trajectories. This work is available on arXiv and is under review at a journal. A preliminary version was presented at the CNS meeting [42] .
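For reference, the large deviation principle invoked here has the standard form. Writing $\Pi^N$ for the image law through the empirical measure and $H$ for the rate function (these symbols are our notation, not the paper's), the statement is that for every closed set $F$ and every open set $O$ of measures,

```latex
\limsup_{N\to\infty} \frac{1}{N}\,\log \Pi^N(F) \;\le\; -\inf_{\mu \in F} H(\mu),
\qquad
\liminf_{N\to\infty} \frac{1}{N}\,\log \Pi^N(O) \;\ge\; -\inf_{\mu \in O} H(\mu),
```

where "good" means the level sets $\{H \le c\}$ are compact. Goodness together with the uniqueness of the global minimum then yields the concentration result: $\Pi^N$ converges weakly to the Dirac mass at the unique minimizer, which is the limit measure characterized in the text.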

Beyond dynamical mean-field theory of neural networks

Participants : Bruno Cessac, Massimiliano Muratori.

We consider a set of N firing-rate neurons with discrete-time dynamics and a leak term. The nonlinearity of the sigmoid is controlled by a gain parameter, and each neuron has a firing threshold; the thresholds are uncorrelated Gaussian random variables. The network is fully connected, with correlated Gaussian random synaptic weights of mean zero and a prescribed covariance matrix. When the synaptic weights are uncorrelated, dynamic mean-field theory allows us to draw the bifurcation diagram of the model in the thermodynamic limit (N tending to infinity): in particular, there is a sharp transition from fixed point to chaos, characterized by the maximal Lyapunov exponent, which is known analytically in the thermodynamic limit. However, mean-field theory is exact only in the thermodynamic limit and when the synaptic weights are uncorrelated. What deviations from mean-field theory are observed when one departs from these hypotheses? We first studied the finite-size dynamics. For finite N, the maximal Lyapunov exponent has a plateau at 0, corresponding to a transition to chaos by quasi-periodicity where the dynamics is at the edge of chaos. This plateau disappears in the thermodynamic limit. Thus, mean-field theory neglects an important finite-size effect, since neuronal dynamics at the edge of chaos has strong implications for the learning performance of the network. We also studied the effect of weak correlations on the dynamics: even when the correlation is small, one detects a significant deviation in the maximal Lyapunov exponent. This work was presented at the CNS conference in Paris, 2013 [43] .
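The class of models described above can be sketched numerically. The snippet below is a minimal illustration, not the paper's exact model: the update rule x(t+1) = gamma*x(t) + J @ S(g*(x(t) - theta)), the 1/sqrt(N) weight scaling, and all parameter values are our assumptions, and the weights here are uncorrelated. It estimates the maximal Lyapunov exponent with the standard Benettin two-trajectory method (evolve a reference and a perturbed trajectory, accumulate the log expansion rate, renormalize the perturbation at each step).

```python
import numpy as np

def max_lyapunov(N=200, g=4.0, gamma=0.5, T=2000, seed=0):
    """Estimate the maximal Lyapunov exponent of a discrete-time
    leaky rate network (model form is an illustrative assumption):
        x(t+1) = gamma * x(t) + J @ S(g * (x(t) - theta))
    """
    rng = np.random.default_rng(seed)
    # Uncorrelated Gaussian weights, variance 1/N; Gaussian thresholds.
    J = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
    theta = rng.normal(0.0, 0.1, size=N)
    S = lambda u: 1.0 / (1.0 + np.exp(-u))  # sigmoid with gain g applied below

    x = rng.normal(0.0, 0.1, size=N)
    d0 = 1e-8                               # perturbation norm kept fixed
    delta = rng.normal(size=N)
    y = x + d0 * delta / np.linalg.norm(delta)

    log_growth = 0.0
    for _ in range(T):
        x = gamma * x + J @ S(g * (x - theta))
        y = gamma * y + J @ S(g * (y - theta))
        d = np.linalg.norm(y - x)
        log_growth += np.log(d / d0)
        y = x + (d0 / d) * (y - x)          # renormalize the perturbation
    return log_growth / T
```

With a small gain the dynamics is contractive and the exponent is negative; increasing the gain drives the estimate toward zero and beyond, which is how the plateau near 0 at finite N, mentioned in the text, can be probed numerically.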